-
This compressed tarball archive contains the datasets and scripts necessary to visualize the residence time distributions, travel time distributions, and storage selection functions for the Fourth of July Creek transient simulations. The scripts and datasets are formatted as Matlab m-file scripts and MAT archives.
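As a hedged illustration, a MAT archive of this kind can also be inspected outside of Matlab with SciPy, which reads MAT files saved before version 7.3. The filename and variable names below are placeholders, not the archive's documented contents; the bundled m-file scripts remain authoritative.

```python
# Hypothetical sketch: peeking into one of the MAT archives from Python.
# 'transient_distributions.mat', 'tau', and 'rtd' are placeholder names.
from scipy.io import loadmat  # handles MAT files saved before v7.3
import matplotlib.pyplot as plt

data = loadmat("transient_distributions.mat")          # placeholder filename
print([k for k in data if not k.startswith("__")])     # list stored variables

# Assume 'tau' holds ages and 'rtd' a residence time density (placeholders).
plt.plot(data["tau"].squeeze(), data["rtd"].squeeze())
plt.xlabel("Age")
plt.ylabel("Residence time density")
plt.show()
```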
-
{"Abstract":["The compressed tarball archive contains the simulation files for the Fourth of July Creek basin in the White Clouds mountains of central Idah0, USA under wet, normal, and dry flow conditions using reconstructed, transient inputs. This archive builds on the "Fourth of July Creek Ensemble Simulation Outputs" (doi: 10.7273/000004796) by adding consideration of fully transient conditions and the associated uncertainty. Combined with that archive, all the files necessary to reproduce the runs are provided here. The runs are computationally demanding so this archive also contains the minimally processed datasets of the large outputs including pressure heads and/or streamflow at the monitoring locations and streamflow along the main stem of the creek. Contained in the "Output" folder is the dataset (saved as a Matlab data object) and scripts to plot the responses at monitoring locations or along the main stem."]}more » « less
-
{"Abstract":["This compressed tarball archive contains the Matlab datasets of the Base-case (denoted SR for "Standard run"), wet case (WR) and dry case (DR) for the transient residence time and travel time distributions and a Matlab script that plots them for the Zonal conceptual model or any of the 250 realizations in the stochastic ensemble. The details of the flow simulations that led to these distributions can be found by searching for "Fourth of July Creek" in the research archive and also through Dr. Engdahl's Google Scholar page."]}more » « less
-
These are the simulation outputs for an ensemble simulation of the expected response of the Fourth of July Basin in the White Clouds Mountains of central Idaho. The three files represent a base-case recharge scenario and two perturbations of ±10%. All files inside the compressed tarball archives are ParFlow binary, a description of which is available at https://parflow.readthedocs.io/en/latest/index.html and https://github.com/parflow/parflow. Each realization in the ensemble simulation contains the steady-state pressure field, velocity field components, and the associated permeability field.
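A minimal sketch of reading one of these outputs, assuming the ParFlow Python tools (the pftools package) are installed; the filename below is a placeholder rather than a file guaranteed to be in the archives.

```python
# Hedged sketch: load a ParFlow binary (.pfb) array with pftools.
import numpy as np
from parflow.tools.io import read_pfb

pressure = read_pfb("realization_001.out.press.pfb")  # placeholder filename
print(pressure.shape)       # 3-D (nz, ny, nx) array of pressure head
print(np.nanmax(pressure))  # e.g., the largest steady-state pressure head
```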
-
Modern hydrologic models have extraordinary capabilities for representing complex processes in surface-subsurface systems. These capabilities have revolutionized the way we conceptualize flow systems, but how to represent uncertainty in simulated flow systems is not as well developed. Currently, characterizing model uncertainty can be computationally expensive, in part because the techniques are appended to the numerical methods rather than seamlessly integrated. The next generation of computers, however, presents opportunities to reformulate the modeling problem so that the uncertainty components are handled more directly within the flow system simulation. Misconceptions about quantum computing abound, and it will not be a “silver bullet” for solving all complex problems, but it might be leveraged for certain kinds of highly uncertain problems, such as groundwater (GW) flow. The point of this issue paper is that the GW community could try to revise the foundations of our models so that the governing equations being solved are tailored specifically for quantum computers. The goal moving forward should not just be to accelerate the models we have, but also to address their deficiencies. Embedding uncertainty into the models by evolving distribution functions will make predictive GW modeling more complicated, but doing so places the problem into a complexity class that is highly efficient on quantum computing hardware. Next-generation GW models could put uncertainty into the problem at the very beginning of a simulation and leave it there throughout, providing a completely new way of simulating subsurface flows.
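As a loose illustration of what "evolving distribution functions" means (a generic classical sketch, not the paper's formulation and nothing quantum), a 1-D Fokker–Planck update advances an entire probability density of a state variable in one pass, instead of running many deterministic realizations.

```python
# Illustrative sketch only: evolve a PDF p(x,t) under dp/dt = -v dp/dx + D d2p/dx2,
# so the uncertainty is carried inside the simulation from the first step.
import numpy as np

nx, dx, dt, v, D = 200, 1.0, 0.25, 1.0, 0.5      # assumed grid and parameters
x = np.arange(nx) * dx
p = np.exp(-0.5 * ((x - 20.0) / 3.0) ** 2)       # initial uncertainty as a PDF
p /= p.sum() * dx                                # normalize to unit mass

for _ in range(200):
    adv = -v * (p - np.roll(p, 1)) / dx          # upwind advection (v > 0)
    dif = D * (np.roll(p, -1) - 2 * p + np.roll(p, 1)) / dx**2  # central diffusion
    p = p + dt * (adv + dif)                     # explicit, CFL-stable here

mean = (x * p).sum() * dx
var = (((x - mean) ** 2) * p).sum() * dx
print("mean =", mean, "variance =", var)         # uncertainty grows with time
```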
-
Lagrangian particle tracking schemes allow a wide range of flow and transport processes to be simulated accurately, but a major challenge is numerically implementing the inter-particle interactions in an efficient manner. This article develops a multi-dimensional, parallelized domain decomposition (DDC) strategy for mass-transfer particle tracking (MTPT) methods in which particles exchange mass dynamically. We show that this can be efficiently parallelized by employing large numbers of CPU cores to accelerate run times. In order to validate the approach and our theoretical predictions, we focus our efforts on a well-known benchmark problem with pure diffusion, where analytical solutions in any number of dimensions are well established. In this work, we investigate different procedures for “tiling” the domain in two and three dimensions (2-D and 3-D), as this type of formal DDC construction is currently limited to 1-D. An optimal tiling is prescribed based on physical problem parameters and the number of available CPU cores, as each tiling provides distinct results in both accuracy and run time. We further extend the most efficient technique to 3-D for comparison, leading to an analytical discussion of the effect of dimensionality on strategies for implementing DDC schemes. Increasing computational resources (cores) within the DDC method produces a trade-off between inter-node communication and on-node work. For an optimally subdivided diffusion problem, the 2-D parallelized algorithm achieves nearly perfect linear speedup, in comparison with the serial run, up to around 2700 cores, reducing a 5 h simulation to 8 s, while the 3-D algorithm maintains appreciable speedup up to 1700 cores.
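As a toy sketch of one ingredient of such a scheme (not the authors' prescription, which also depends on the physical problem parameters), one can split a given core count into a near-square 2-D tiling and size each tile's halo from the Gaussian mass-transfer kernel's effective reach, assumed here to scale as sqrt(2*D*dt).

```python
# Toy sketch of two DDC ingredients: a near-square tiling of the core count,
# and the halo ("ghost") width over which neighboring tiles exchange particle
# mass, taken as a few standard deviations of the assumed Gaussian kernel.
import math

def near_square_tiling(n_cores: int) -> tuple[int, int]:
    """Return (rows, cols) with rows * cols == n_cores, as square as possible."""
    best = (1, n_cores)
    for r in range(1, math.isqrt(n_cores) + 1):
        if n_cores % r == 0:
            best = (r, n_cores // r)  # r grows toward sqrt(n), last hit is squarest
    return best

def halo_width(D: float, dt: float, n_sigma: float = 3.0) -> float:
    """Distance beyond a tile edge within which particles can still exchange mass."""
    return n_sigma * math.sqrt(2.0 * D * dt)

rows, cols = near_square_tiling(2700)
print(rows, "x", cols, "tiles; halo =", halo_width(D=1.0, dt=0.1))
```

A squarer tiling shortens tile perimeters relative to their area, which reduces the inter-node communication term in the trade-off the abstract describes.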